An accelerated minimax algorithm for convex-concave saddle point problems with nonsmooth coupling function

Authors

Abstract

In this work we aim to solve a convex-concave saddle point problem, where the coupling function is smooth in one variable, nonsmooth in the other, and not assumed to be linear in either. The problem is augmented by a nonsmooth regulariser in the smooth component. We propose and investigate a novel algorithm under the name OGAProx, consisting of an optimistic gradient ascent step in the smooth variable coupled with a proximal step of the regulariser, which is alternated with a proximal step in the nonsmooth component of the coupling function. We consider the convex-concave, convex-strongly concave and strongly convex-strongly concave settings of the saddle point problem under investigation. Regarding the iterates we obtain (weak) convergence, a convergence rate of order $\mathcal{O}(\frac{1}{K})$ and linear convergence like $\mathcal{O}(\theta^{K})$ with $\theta < 1$, respectively. In terms of function values we obtain ergodic convergence rates of order $\mathcal{O}(\frac{1}{K})$, $\mathcal{O}(\frac{1}{K^{2}})$ and $\mathcal{O}(\theta^{K})$ with $\theta < 1$, respectively. We validate our theoretical considerations on a nonsmooth-linear saddle point problem, the training of multi kernel support vector machines, and a classification problem incorporating minimax group fairness.
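To make the verbal description above more tangible, the following toy script runs a proximal, optimistic-gradient primal-dual loop on a bilinear saddle point reformulation of the lasso problem. The linear coupling, quadratic regulariser, $\ell_1$ term, step sizes and iteration count are all assumptions made for this sketch; it illustrates the flavour of "optimistic gradient ascent plus proximal steps" but is not the OGAProx algorithm or its step-size rules.

```python
import numpy as np

# Illustrative sketch only: a proximal / optimistic-gradient primal-dual loop for
#   min_x max_y  lam*||x||_1 + y^T (K x - b) - 0.5*||y||^2,
# whose inner maximisation in y yields the lasso problem
#   min_x lam*||x||_1 + 0.5*||K x - b||^2.
# The coupling here is bilinear and the step sizes are ad hoc; this is NOT the
# OGAProx method from the paper, only a toy with the same flavour of updates.

rng = np.random.default_rng(0)
m, n = 20, 50
K = rng.standard_normal((m, n))
b = rng.standard_normal(m)
lam = 0.1                              # weight of the l1 regulariser (assumed)
tau = sigma = 0.02                     # primal / dual step sizes (assumed small enough)

def prox_l1(v, t):
    """Proximal map of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x, y = np.zeros(n), np.zeros(m)
grad_y_prev = K @ x - b                # gradient of the coupling in y at the previous iterate

for _ in range(3000):
    # "optimistic" (extrapolated) gradient ascent in y, then the prox of g(y) = 0.5*||y||^2
    grad_y = K @ x - b
    y = (y + sigma * (2.0 * grad_y - grad_y_prev)) / (1.0 + sigma)
    grad_y_prev = grad_y
    # proximal gradient descent step in x for the nonsmooth term lam*||x||_1
    x = prox_l1(x - tau * (K.T @ y), tau * lam)

print("primal objective:", 0.5 * np.linalg.norm(K @ x - b) ** 2 + lam * np.abs(x).sum())
```

With step sizes this small the iterates should settle near a minimiser of the lasso objective printed at the end; the paper's analysis, in contrast, covers couplings that are nonsmooth in one variable and not assumed linear in either.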


Similar articles

An Accelerated HPE-Type Algorithm for a Class of Composite Convex-Concave Saddle-Point Problems

This article proposes a new algorithm for solving a class of composite convex-concave saddle-point problems. The new algorithm is a special instance of the hybrid proximal extragradient framework in which an accelerated variant of Nesterov's method is used to approximately solve the prox subproblems. One of the advantages of the new method is that it works for any constant choice of proximal stepsize. ...
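For orientation, the sketch below implements the classical (non-accelerated) extragradient method on a smooth, strongly monotone toy saddle-point operator; extragradient is often viewed as the simplest instance of the hybrid proximal extragradient idea, with a single gradient step serving as the approximate prox. It is not the accelerated HPE-type method of the cited article, and the quadratic-bilinear coupling and step size are assumptions chosen for the demo.

```python
import numpy as np

# Classical extragradient (predictor-corrector) sketch for the saddle function
#   Phi(x, y) = 0.5*mu*||x||^2 + x^T A y - 0.5*mu*||y||^2,
# i.e. the strongly monotone operator F(x, y) = (mu*x + A y, mu*y - A^T x).
# One gradient step plays the role of an approximate prox step; this is NOT the
# accelerated HPE-type method of the cited article.

rng = np.random.default_rng(1)
n, mu = 30, 0.1
A = rng.standard_normal((n, n))

def F(z):
    x, y = z[:n], z[n:]
    return np.concatenate([mu * x + A @ y, mu * y - A.T @ x])

L = mu + np.linalg.norm(A, 2)          # Lipschitz bound for F
lam = 0.9 / L                          # step size below 1/L
z = rng.standard_normal(2 * n)

print("initial residual ||F(z)||:", np.linalg.norm(F(z)))
for _ in range(2000):
    z_half = z - lam * F(z)            # predictor (approximate prox step)
    z = z - lam * F(z_half)            # corrector (extragradient update)
print("final residual   ||F(z)||:", np.linalg.norm(F(z)))
```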


An accelerated non-Euclidean hybrid proximal extragradient-type algorithm for convex-concave saddle-point problems

This paper describes an accelerated HPE-type method based on general Bregman distances for solving monotone saddle-point (SP) problems. The algorithm is a special instance of a non-Euclidean hybrid proximal extragradient framework introduced by Svaiter and Solodov [28] where the prox sub-inclusions are solved using an accelerated gradient method. It generalizes the accelerated HPE algorithm pre...
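To illustrate what a Bregman (non-Euclidean) prox step looks like in practice, the sketch below runs mirror-prox-style updates with the entropy Bregman distance on a random matrix game over probability simplices. It is only a reference for multiplicative, simplex-respecting prox steps and ergodic averaging, not the accelerated Bregman HPE algorithm of the cited paper; the game matrix, step size and iteration count are assumptions.

```python
import numpy as np

# Entropy-Bregman (mirror-prox style) sketch for the matrix game
#   min_{x in simplex} max_{y in simplex}  x^T A y.
# A Bregman prox step with the negative-entropy distance is a multiplicative update
# followed by renormalisation. This only illustrates non-Euclidean prox steps; it is
# NOT the accelerated non-Euclidean HPE-type algorithm of the cited paper.

rng = np.random.default_rng(2)
n, m = 40, 60
A = rng.standard_normal((n, m))
T = 3000
step = 0.5 / np.abs(A).max()                   # conservative step size for the demo

def mirror_step(p, grad, step):
    """Entropy-Bregman prox on the simplex: p_i <- p_i * exp(-step * grad_i), renormalised."""
    q = p * np.exp(-step * grad)
    return q / q.sum()

x = np.full(n, 1.0 / n)
y = np.full(m, 1.0 / m)
x_avg, y_avg = np.zeros(n), np.zeros(m)

for _ in range(T):
    # predictor step from the current point
    x_half = mirror_step(x, A @ y, step)       # gradient in x of x^T A y is A y
    y_half = mirror_step(y, -(A.T @ x), step)  # ascent in y, so negate the gradient A^T x
    # corrector step, again from the current point, using the predictor's gradients
    x = mirror_step(x, A @ y_half, step)
    y = mirror_step(y, -(A.T @ x_half), step)
    x_avg += x_half / T
    y_avg += y_half / T

gap = (A.T @ x_avg).max() - (A @ y_avg).min()  # duality gap of the averaged strategies
print("duality gap of ergodic averages:", gap)
```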


A simple algorithm for a class of nonsmooth convex-concave saddle-point problems

This supplementary material includes numerical examples demonstrating the flexibility and potential of the algorithm PAPC developed in the paper. We show that PAPC does behave numerically as predicted by the theory, and can efficiently solve problems which cannot be solved by well-known state-of-the-art algorithms sharing the same efficiency estimate. Here, for illustration purposes, we compare ...


An Interior Point Algorithm for Solving Convex Quadratic Semidefinite Optimization Problems Using a New Kernel Function

In this paper, we consider convex quadratic semidefinite optimization problems and provide a primal-dual Interior Point Method (IPM) based on a new kernel function with a trigonometric barrier term. The iteration complexity of the algorithm is analyzed using some easy-to-check and mild conditions. Although our proposed kernel function is neither a Self-Regular (SR) fun...
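As background for the kernel-function machinery, the barrier in such primal-dual IPMs is built coordinate-wise from a univariate kernel function; the classical logarithmic kernel is shown below as a reference point. The trigonometric barrier term introduced in the cited paper is not reproduced here.

$$
\Psi(v) \;=\; \sum_{i=1}^{n} \psi(v_i), \qquad \psi_{\mathrm{log}}(t) \;=\; \frac{t^{2}-1}{2} \;-\; \log t \quad (t>0),
$$

with $\psi(1)=\psi'(1)=0$ and $\psi$ strictly convex, so that $\Psi(v)=0$ exactly when the scaled iterate $v$ lies on the central path ($v=e$).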


Saddle Point Seeking for Convex Optimization Problems

In this paper, we consider convex optimization problems with constraints. By combining the idea of a Lie bracket approximation for extremum seeking systems with saddle point algorithms, we propose a feedback which steers a single-integrator system to the set of saddle points of the Lagrangian associated with the convex optimization problem. We prove practical uniform asymptotic stability of the se...
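As a point of comparison, the script below integrates the plain saddle-point (gradient descent-ascent) dynamics of the Lagrangian of a toy equality-constrained quadratic program with forward Euler; it shows what convergence to the saddle points of the Lagrangian means, but it is not the Lie-bracket extremum-seeking feedback of the cited paper, and the problem data are made up.

```python
import numpy as np

# Forward-Euler integration of the classical saddle-point (gradient descent-ascent)
# dynamics for the Lagrangian of the toy problem
#   min_x 0.5*||x - c||^2   s.t.   a^T x = b,
# i.e.  xdot = -grad_x L(x, lam) = -(x - c) - lam*a,   lamdot = grad_lam L = a^T x - b.
# This is only a reference for "converging to the saddle points of the Lagrangian";
# it is NOT the Lie-bracket extremum-seeking feedback of the cited paper.

rng = np.random.default_rng(3)
n = 5
c = rng.standard_normal(n)
a = rng.standard_normal(n)
b = 1.0

x = np.zeros(n)
lam = 0.0
dt = 0.01

for _ in range(20000):
    x_dot = -(x - c) - lam * a          # descent on the Lagrangian in x
    lam_dot = a @ x - b                 # ascent on the Lagrangian in the multiplier
    x += dt * x_dot
    lam += dt * lam_dot

# analytic KKT point for comparison
lam_star = (a @ c - b) / (a @ a)
x_star = c - lam_star * a
print("constraint residual:", abs(a @ x - b))
print("distance to KKT point:", np.linalg.norm(x - x_star), abs(lam - lam_star))
```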



Journal

Journal title: Computational Optimization and Applications

Year: 2022

ISSN: 0926-6003, 1573-2894

DOI: https://doi.org/10.1007/s10589-022-00378-8